Chernoff bounds - definition. What are Chernoff bounds

What are Chernoff bounds - definition

EXPONENTIALLY DECREASING BOUNDS ON TAIL DISTRIBUTIONS OF SUMS OF INDEPENDENT RANDOM VARIABLES
Chernoff's inequality; Chernoff bounds; Chernoff inequality; Matrix chernoff bound; Bernstein-Chernoff inequality

Chernoff bound         
In probability theory, the Chernoff bound gives exponentially decreasing bounds on tail distributions of sums of independent random variables. Despite being named after Herman Chernoff, the author of the paper it first appeared in, the result is due to Herman Rubin.
Matrix Chernoff bound         
For certain applications in linear algebra, it is useful to know properties of the probability distribution of the largest eigenvalue of a finite sum of random matrices. Suppose {X_k} is a finite sequence of random matrices.
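One commonly cited form of the matrix Chernoff bound, following Tropp's "user-friendly" matrix tail bounds, is sketched below as an illustration (the conditions and constants come from that line of work, not from the entry above). If the X_k are independent, self-adjoint, d-dimensional, positive semidefinite, and satisfy \lambda_{\max}(\mathbf{X}_k) \le R almost surely, and \mu_{\max} = \lambda_{\max}\big(\sum_k \mathbb{E}[\mathbf{X}_k]\big), then for \delta \ge 0

    \Pr\left[ \lambda_{\max}\Big( \sum_k \mathbf{X}_k \Big) \ge (1+\delta)\,\mu_{\max} \right] \le d \cdot \left( \frac{e^{\delta}}{(1+\delta)^{1+\delta}} \right)^{\mu_{\max}/R}.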
Paul Chernoff         
AMERICAN MATHEMATICIAN
Chernoff, Paul
Paul Robert Chernoff (21 June 1942, Philadelphia – 17 January 2017; biographical information from American Men and Women of Science, Thomson Gale 2004) was an American mathematician, specializing in functional analysis and the mathematical foundations of quantum mechanics. He is known for Chernoff's Theorem, a mathematical result in the Feynman path integral formulation of quantum mechanics.

Wikipedia

Chernoff bound

In probability theory, a Chernoff bound is an exponentially decreasing upper bound on the tail of a random variable based on its moment generating function. The minimum of all such exponential bounds forms the Chernoff or Chernoff-Cramér bound, which may decay faster than exponential (e.g. sub-Gaussian). It is especially useful for sums of independent random variables, such as sums of Bernoulli random variables.
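To make this concrete, the generic bound obtained from the moment generating function can be written as (a standard statement, given here for illustration)

    \Pr(X \ge a) \le \inf_{t > 0} \mathbb{E}\!\left[e^{tX}\right] e^{-ta},

and for a sum X = X_1 + \dots + X_n of independent Bernoulli(p_i) variables with mean \mu = \sum_i p_i, optimizing over t yields the familiar multiplicative form

    \Pr\big(X \ge (1+\delta)\mu\big) \le \left( \frac{e^{\delta}}{(1+\delta)^{1+\delta}} \right)^{\mu}, \qquad \delta \ge 0.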

The bound is commonly named after Herman Chernoff, who described the method in a 1952 paper, though Chernoff himself attributed it to Herman Rubin. In 1938, Harald Cramér had published an almost identical concept, now known as Cramér's theorem.

It is a sharper bound than the first- or second-moment-based tail bounds such as Markov's inequality or Chebyshev's inequality, which only yield power-law bounds on tail decay. However, when applied to sums, the Chernoff bound requires the variates to be independent, a condition that neither Markov's inequality nor Chebyshev's inequality needs (although Chebyshev's inequality, applied to a sum, does require the variates to be pairwise independent).
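A small numerical sketch can illustrate the difference in sharpness. The script below is an illustration written for this page (the distribution, threshold, and grid search are arbitrary choices, not part of the entry); it compares the three bounds for the upper tail of a Binomial(n, p) variable:

    import math

    # Tail bounds for X ~ Binomial(n, p) and the event {X >= a}, with a above the mean.
    n, p = 100, 0.5
    mu = n * p                    # E[X]
    var = n * p * (1 - p)         # Var(X)
    a = 75                        # threshold, a > mu

    # Markov's inequality (first moment only): P(X >= a) <= E[X] / a
    markov = mu / a

    # Chebyshev's inequality (second moment): P(X >= a) <= Var(X) / (a - E[X])^2
    chebyshev = var / (a - mu) ** 2

    # Chernoff bound: P(X >= a) <= inf_{t>0} E[e^{tX}] * e^{-t*a},
    # with E[e^{tX}] = (1 - p + p*e^t)^n for a Binomial(n, p).
    # A coarse grid search over t is enough for illustration.
    chernoff = min(
        (1 - p + p * math.exp(t)) ** n * math.exp(-t * a)
        for t in (i / 1000 for i in range(1, 5000))
    )

    print(f"Markov    : {markov:.4f}")     # ~0.67
    print(f"Chebyshev : {chebyshev:.4f}")  # ~0.04
    print(f"Chernoff  : {chernoff:.2e}")   # ~2e-06

For these values the Chernoff bound is several orders of magnitude smaller than the Markov and Chebyshev bounds, reflecting its exponential rather than power-law decay in the threshold.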

The Chernoff bound is related to the Bernstein inequalities. It is also used to prove Hoeffding's inequality, Bennett's inequality, and McDiarmid's inequality.
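As an illustration of that connection, Hoeffding's inequality in one standard one-sided form states that for independent X_i with X_i in [a_i, b_i] and S = X_1 + \dots + X_n,

    \Pr\big(S - \mathbb{E}[S] \ge t\big) \le \exp\!\left( - \frac{2 t^2}{\sum_{i=1}^n (b_i - a_i)^2} \right), \qquad t > 0,

which follows by the Chernoff method: bound the moment generating function of each centered X_i (Hoeffding's lemma) and optimize over the exponent parameter.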